to Increase the Reproducibility of Your Research
12 February 2026
RESEARCHER A:
“I baked this cake…
YOU:
“I want that too! So I’ll
| | Same Data | Different Data |
|---|---|---|
| Same Analysis | Reproducible | Replicable |
| Different Analysis | Robust | Generalizable |
The cumulative nature of science fundamentally depends on researchers building upon others’ findings (Merton, 1973)
“In principle, all reported evidence should be reproducible” (Nosek et al., 2022, p. 721)
Artner et al. (2021): 232 scientific claims from 46 journal articles
Crüwell et al. (2023):
All articles from one issue of Psychological Science
Hardwicke et al. (2021):
Articles with an open data badge (Psychological Science, 2014–2015)
Obels et al. (2020):
36 registered reports that shared both code and data
Renaming files
Hard-coding file paths (see the sketch after this list)
Copy-paste errors
Incorrect rounding
Old package versions
Non-standardized computational environment (e.g., older software versions)
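One common way to avoid the hard-coded-path problem listed above is to build paths relative to the project root, for example with the here package. The sketch below is illustrative only; the file path is made up:

```r
# Fragile: breaks on any other machine or when the folder moves.
# dat <- read.csv("C:/Users/me/Desktop/project/data/study1.csv")

# Portable alternative: the here package resolves paths from the project root.
library(here)
dat <- read.csv(here("data", "study1.csv"))  # hypothetical file
```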
Joint analyses with co-authors
Reviewers check your analyses
Further analyses after review
Recalculation for meta-analyses
System-independent executable → Compute environment control
Executable free of charge → Cost-free software
Comprehensible for yourself and others → Literate programming
Executable over the long term → Compute environment control
The Workflow for Open Reproducible Code in Science (WORCS) is an excellent framework that also integrates the recommendations given here (Van Lissa et al., 2021).
Remaining challenges:
Educational research: Often multi-step procedures with various software packages
Not reproducible ←―――――――――――――→ Gold standard
Concept by Knuth (1984):
Modern implementations:
- Quarto
- R Markdown
- Jupyter Notebooks
Left side: Markdown + Code blocks
Right side: Rendered document
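As an illustration of what such a document can look like, here is a minimal Quarto sketch (not taken from the slides; the file name, variables, and numbers are invented) in which the reported value is computed from the data rather than typed in by hand:

````markdown
---
title: "Reanalysis of Study 1"   # hypothetical title
format: html
---

The mean reaction time below is computed directly from the raw data,
so the reported number cannot drift out of sync with the analysis.

```{r}
dat <- read.csv("data/study1.csv")  # hypothetical file
mean_rt <- mean(dat$reaction_time, na.rm = TRUE)
```

The mean reaction time was `r round(mean_rt)` ms.
````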
For science:
For you:
Problem: Different software versions and operating systems
Solutions:
Three lines of code change:
Result:
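The slide does not spell out which tool or which lines of code are meant; as one possibility in an R-based workflow, the renv package can pin a project’s package versions with roughly this much code (a sketch, not the presenter’s exact setup):

```r
# Sketch using the renv package; an assumption, the slide does not name the tool.
install.packages("renv")  # once per machine
renv::init()              # create a project-local library and renv.lock
renv::snapshot()          # record the exact package versions in renv.lock
```

A collaborator can later call `renv::restore()` to reinstall exactly those package versions on their own machine.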
Science is fundamentally cumulative (Merton, 1973):
Each step requires:
Artner et al. (2021) reproduction study:
Their conclusion: “Vagueness makes assessing reproducibility a nightmare”
The problem: Inadequate documentation and data management
FAIR Principles provide structure for:
Not just “making data available” but making it systematically reusable
Reproducible reporting solves one problem:
But another problem remains:
→ FAIR principles address these questions
Real examples of reproducibility barriers:
Availability alone ≠ Reproducibility
Framework for systematic practices that enable reuse
Principle: “As open as possible, as closed as necessary”
Health research (Martínez-García et al., 2023):
Reproducibility study (Artner et al., 2021):
Components:
Repositories:
- Zenodo.org
- OSF.io
- Research data centers (e.g., Verbund FDB)
Gradations:
Important: Transparent documentation of access path
Use of standardized, open formats:
Alternative open-source software:
Comprehensive documentation:
Standards: PSYCH-DS for psychological data
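As a small illustration of open formats plus documentation (my example, not from the slides; the variables are invented), data can be exported as plain CSV together with a minimal codebook that explains every column:

```r
# Toy data frame standing in for a cleaned study data set (hypothetical).
dat <- data.frame(
  id            = 1:3,
  reaction_time = c(512, 487, 530),
  condition     = c("control", "treatment", "control")
)

# Open, non-proprietary format instead of, e.g., .sav or .xlsx.
write.csv(dat, "study1_clean.csv", row.names = FALSE)

# Minimal codebook so others can interpret every variable.
codebook <- data.frame(
  variable    = names(dat),
  description = c("Participant identifier",
                  "Reaction time in milliseconds",
                  "Experimental condition (control / treatment)")
)
write.csv(codebook, "study1_codebook.csv", row.names = FALSE)
```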
Good examples:
Challenge: Scalable approaches for smaller projects with limited resources
Answer:
Initial investment: Yes, there’s a learning curve
Long-term benefits:
Think incremental: Start small, improve gradually
Answer:
The paradox of open code:
Perfect is the enemy of good: Imperfect but documented code > No code at all
Answer:
FAIR ≠ Open:
For educational research:
Answer:
Your advantages persist:
Reframing:
Science as common good: Collective progress benefits everyone
Reproducibility is not optional—it’s core to science:
Two practical approaches:
Result: Stronger, more credible, more impactful research
Reproducibility means:
Start today:
Perfect is the enemy of good—begin somewhere! 💪
Questions?
Complete reference list available in the paper.
Title image by rael frames on Unsplash
Olaf (from Frozen): dailymail.co.uk
FAIR-Logo: SangyaPundir on wikimedia commons
Slides: t1p.de/gebf25-os